DP-HyPO: An Adaptive Private Hyperparameter Optimization Framework

Neural Information Processing Systems

In contrast, in non-private settings, practitioners commonly use "adaptive" hyperparameter optimization methods such as Gaussian process-based optimization, which select the next candidate based on information gathered from previous outputs. This substantial contrast between private and non-private hyperparameter optimization underscores a critical concern. In our paper, we introduce DP-HyPO, a pioneering framework for "adaptive" private hyperparameter optimization.
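To make "adaptive" concrete, here is a minimal, self-contained sketch of Gaussian process-based hyperparameter selection: a tiny GP surrogate with an upper-confidence-bound acquisition picks the next candidate from what earlier evaluations revealed. The kernel, acquisition rule, and toy objective are illustrative assumptions, not DP-HyPO itself (in particular, no privacy accounting is shown).

```python
import math

def rbf(a, b, ls=0.3):
    # squared-exponential kernel; ls is an assumed length scale
    return math.exp(-((a - b) ** 2) / (2 * ls * ls))

def solve(A, b):
    # Gaussian elimination with partial pivoting (solves A x = b)
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(xs, ys, x, noise=1e-4):
    # GP posterior mean/variance at x given observations (xs, ys)
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    alpha = solve(K, ys)
    k_star = [rbf(x, a) for a in xs]
    mean = sum(w * k for w, k in zip(alpha, k_star))
    v = solve(K, k_star)
    var = rbf(x, x) - sum(a * b for a, b in zip(k_star, v))
    return mean, max(var, 0.0)

def propose_next(xs, ys, grid, kappa=2.0):
    # adaptive step: pick the candidate maximizing the UCB acquisition,
    # i.e. the choice depends on all previous outputs
    def ucb(x):
        m, v = gp_posterior(xs, ys, x)
        return m + kappa * math.sqrt(v)
    return max(grid, key=ucb)

# toy objective: validation score as a function of one hyperparameter,
# peaked at 0.6 (purely illustrative)
f = lambda x: -(x - 0.6) ** 2

xs, ys = [0.1, 0.9], [f(0.1), f(0.9)]
for _ in range(6):
    x = propose_next(xs, ys, [i / 50 for i in range(51)])
    xs.append(x)
    ys.append(f(x))
best = max(xs, key=f)
```

Each round fits the surrogate to all evaluations so far, so later candidates concentrate near promising regions; this information flow between rounds is exactly what naive private hyperparameter tuning forgoes.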

Learning to Mutate with Hypergradient Guided Population

Neural Information Processing Systems

To address the above challenges, we propose a novel hyperparameter mutation (HPM) scheduling algorithm in this study, which adopts a population-based training framework to explicitly learn a trade-off (i.e., a mutation schedule) between using the hypergradient-guided local search and the mutation-driven global search.
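The trade-off described above can be caricatured in a few lines: a toy population-based training loop where a hand-coded schedule `p_local` (a stand-in for the learned mutation schedule) decides, per member, between a finite-difference "hypergradient" step (local search) and a random mutation (global search). The objective, schedule, and constants are illustrative assumptions, not the paper's HPM algorithm.

```python
import random

def score(h):
    # toy validation score peaked at h = 2.0 (stand-in for accuracy
    # as a function of a single hyperparameter h)
    return -(h - 2.0) ** 2

def hypergradient(h, eps=1e-3):
    # finite-difference surrogate for d score / d h
    return (score(h + eps) - score(h - eps)) / (2 * eps)

def pbt_step(pop, p_local, rng, lr=0.05, sigma=0.5):
    # exploit: bottom half copies a top performer
    pop.sort(key=score, reverse=True)
    half = len(pop) // 2
    for i in range(half, len(pop)):
        pop[i] = pop[rng.randrange(half)]
    # explore: schedule chooses local vs. global search per member
    for i in range(len(pop)):
        if rng.random() < p_local:
            pop[i] += lr * hypergradient(pop[i])   # hypergradient-guided local search
        else:
            pop[i] += rng.gauss(0.0, sigma)        # mutation-driven global search
    return pop

rng = random.Random(0)
pop = [rng.uniform(-4.0, 4.0) for _ in range(8)]
best_ever = max(pop, key=score)
for t in range(40):
    # fixed schedule: favor global mutation early, local refinement late
    # (HPM instead *learns* this trade-off)
    p_local = t / 40
    pop = pbt_step(pop, p_local, rng)
    cand = max(pop, key=score)
    if score(cand) > score(best_ever):
        best_ever = cand
```

Mutation alone explores broadly but refines slowly, while hypergradient steps alone get stuck near their basin; the schedule is what arbitrates between the two, which is the quantity the paper proposes to learn rather than fix.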